
    Using segmented objects in ostensive video shot retrieval

    This paper presents a system for video shot retrieval in which shots are retrieved by matching video objects using a combination of colour, shape and texture. Rather than matching on individual objects, our system supports sets of query objects which together reflect the user’s object-based information need. Our work also adapts to a shifting user information need by partitioning a user’s search into two or more distinct search threads, which the user can follow in sequence. This automatic process maps neatly onto the ostensive model of information retrieval in that it allows a user to place a virtual checkpoint on their search, explore one thread or aspect of their information need, and then return to that checkpoint to explore an alternative thread. Our system is fully functional and operational, and in this paper we illustrate several design decisions we made in building it.
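
    The object-matching step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature representations, the histogram-intersection similarity and the weights are all assumptions.

        import numpy as np

        def feature_sim(a, b):
            # Histogram-intersection similarity for normalised feature vectors.
            return float(np.minimum(a, b).sum())

        def object_similarity(q, c, w=(0.5, 0.3, 0.2)):
            # Weighted combination of colour, shape and texture similarity.
            return (w[0] * feature_sim(q["colour"], c["colour"]) +
                    w[1] * feature_sim(q["shape"], c["shape"]) +
                    w[2] * feature_sim(q["texture"], c["texture"]))

        def shot_score(query_objects, shot_objects):
            # Match each query object to its best candidate in the shot and average,
            # so the whole set of query objects contributes to the shot's ranking.
            return sum(max(object_similarity(q, c) for c in shot_objects)
                       for q in query_objects) / len(query_objects)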

    A 3-Tier Planning Architecture for Managing Tutorial Dialogue

    Managing tutorial dialogue is an intrinsically complex task that is only partially covered by current models of dialogue processing.

    Towards Computational Persuasion via Natural Language Argumentation Dialogues

    Computational persuasion aims to capture the human ability to persuade through argumentation, for applications such as behaviour change in healthcare (e.g. persuading people to take more exercise or eat more healthily). In this paper, we review research in computational persuasion that incorporates domain modelling (capturing arguments and counterarguments that can appear in a persuasion dialogue), user modelling (capturing the beliefs and concerns of the persuadee), and dialogue strategies (choosing the best moves for the persuader to maximize the chances that the persuadee is persuaded). We discuss the evaluation of prototype systems that obtain the user’s counterarguments by allowing them to be selected from a menu. We then consider how this work might be enhanced by incorporating a natural language interface in the form of an argumentative chatbot.
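
    As a minimal illustration of the dialogue-strategy idea above (not the authors' system), a persuader can rank candidate arguments by combining a user model with each argument's relevance to the persuasion goal; the argument names and scores below are invented.

        # Persuadee's estimated belief in each candidate argument (user model).
        user_model = {
            "exercise_reduces_stress": 0.8,
            "exercise_saves_money": 0.3,
            "exercise_improves_sleep": 0.6,
        }
        # How strongly each argument supports the goal behaviour (domain model).
        relevance = {
            "exercise_reduces_stress": 0.7,
            "exercise_saves_money": 0.4,
            "exercise_improves_sleep": 0.9,
        }

        # Choose the move with the highest expected persuasive impact.
        best_move = max(user_model, key=lambda a: user_model[a] * relevance[a])
        print("Next argument to present:", best_move)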

    Making effective use of healthcare data using data-to-text technology

    Healthcare organizations are in a continuous effort to improve health outcomes, reduce costs and enhance the patient experience of care. Data is essential to measure and help achieve these improvements in healthcare delivery. Consequently, an influx of data from various clinical, financial and operational sources is now overwhelming healthcare organizations and their patients. The effective use of this data, however, is a major challenge. Clearly, text is an important medium for making data accessible. Financial reports are produced to assess healthcare organizations on key performance indicators and to steer their healthcare delivery. Similarly, at a clinical level, data on patient status is conveyed by means of textual descriptions to facilitate patient review, shift handover and care transitions. Likewise, patients are informed about data on their health status and treatments via text, in the form of reports or via ehealth platforms, by their doctors. Unfortunately, such text is the outcome of a highly labour-intensive process when it is produced by healthcare professionals. It is also prone to incompleteness and subjectivity, and it is hard to scale up to different domains, wider audiences and varying communication purposes. Data-to-text is a recent breakthrough technology in artificial intelligence which automatically generates natural language, in the form of text or speech, from data. This chapter provides a survey of data-to-text technology, with a focus on how it can be deployed in a healthcare setting. It will (1) give an up-to-date synthesis of data-to-text approaches, (2) give a categorized overview of use cases in healthcare, (3) seek to make a strong case for evaluating and implementing data-to-text in a healthcare setting, and (4) highlight recent research challenges. Comment: 27 pages, 2 figures, book chapter.
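
    A toy example of the data-to-text idea: a rule-based generator that turns structured patient observations into a handover sentence. This sketch is not from the chapter; the field names and the 38.0 degree review threshold are illustrative assumptions.

        def patient_summary(obs):
            # Describe the temperature trend and flag values above a review threshold.
            trend = "has risen" if obs["temp_now"] > obs["temp_prev"] else "has fallen"
            alert = " and should be reviewed" if obs["temp_now"] >= 38.0 else ""
            return (f"{obs['name']}'s temperature {trend} to "
                    f"{obs['temp_now']:.1f} degrees Celsius{alert}.")

        print(patient_summary({"name": "Patient A", "temp_prev": 37.2, "temp_now": 38.4}))
        # -> Patient A's temperature has risen to 38.4 degrees Celsius and should be reviewed.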

    Developing a predictive modelling capacity for a climate change-vulnerable blanket bog habitat: Assessing 1961-1990 baseline relationships

    Aim: Understanding the spatial distribution of high priority habitats and developing predictive models using climate and environmental variables to replicate these distributions are desirable conservation goals. The aim of this study was to model and elucidate the contributions of climate and topography to the distribution of a priority blanket bog habitat in Ireland, and to examine how this might inform the development of a climate change predictive capacity for peatlands in Ireland. Methods: Ten climatic and two topographic variables were recorded for grid cells with a spatial resolution of 10 × 10 km, covering 87% of the mainland land surface of Ireland. Presence-absence data were matched to these variables and generalised linear models (GLMs) fitted to identify the main climatic and terrain predictor variables for occurrence of the habitat. Candidate predictor variables were screened for collinearity, and the accuracy of the final fitted GLM was evaluated using fourfold cross-validation based on the area under the curve (AUC) derived from a receiver operating characteristic (ROC) plot. The GLM-predicted habitat occurrence probability maps were compared against the actual distributions using GIS techniques. Results: Despite the apparent parsimony of the initial GLM using only climatic variables, further testing indicated collinearity among, for example, temperature and precipitation variables. Subsequent elimination of the collinear variables and inclusion of elevation data produced excellent performance based on the AUC scores of the final GLM. Mean annual temperature and total mean annual precipitation, in combination with elevation range, were the most powerful explanatory variable group among those explored for the presence of blanket bog habitat. Main conclusions: The results confirm that this habitat distribution can in general be modelled well using the non-collinear climatic and terrain variables tested at the grid resolution used. Mapping the GLM-predicted distribution against the observed distribution produced useful results in replicating the projected occurrence of the habitat over an extensive area. The methods developed will usefully inform future climate change predictive modelling for Ireland.
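
    The modelling and evaluation workflow described above can be sketched with standard tooling. This is a hedged illustration, not the authors' code: the file name, the column names and the use of scikit-learn's logistic regression (a binomial GLM with a logit link) are assumptions.

        import pandas as pd
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import StratifiedKFold, cross_val_score

        # Hypothetical table of 10 x 10 km grid cells with climate/terrain variables.
        grid = pd.read_csv("blanket_bog_grid_cells.csv")
        predictors = ["mean_annual_temp", "total_annual_precip", "elevation_range"]

        # Screen candidate predictors for collinearity before fitting.
        print(grid[predictors].corr())

        X, y = grid[predictors], grid["bog_present"]  # presence/absence (0/1)

        glm = LogisticRegression(max_iter=1000)
        cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
        auc = cross_val_score(glm, X, y, cv=cv, scoring="roc_auc")
        print("Fourfold cross-validated AUC:", auc.mean())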

    Building capacity for evidence informed decision making in public health: a case study of organizational change

    Background: Core competencies for public health in Canada require proficiency in evidence informed decision making (EIDM). However, decision makers often lack access to information, many workers lack knowledge and skills to conduct systematic literature reviews, and public health settings typically lack infrastructure to support EIDM activities. This research was conducted to explore and describe critical factors and dynamics in the early implementation of one public health unit's strategic initiative to develop capacity to make EIDM standard practice. Methods: This qualitative case study was conducted in one public health unit in Ontario, Canada between 2008 and 2010. In-depth information was gathered from two sets of semi-structured interviews and focus groups (n = 27) with 70 members of the health unit, and through a review of 137 documents. Thematic analysis was used to code the key informant and document data. Results: The critical factors and dynamics for building EIDM capacity at an organizational level included: clear vision and strong leadership, workforce and skills development, ability to access research (library services), fiscal investments, acquisition and development of technological resources, a knowledge management strategy, effective communication, a receptive organizational culture, and a focus on change management. Conclusion: With leadership, planning, commitment and substantial investments, a public health department has made significant progress, within the first two years of a 10-year initiative, towards achieving its goal of becoming an evidence informed decision making organization.

    Biomedical informatics and translational medicine

    Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.